Tensor Rank is NP-Complete
Abstract
We prove that computing the rank of a three-dimensional tensor over any finite field is NP-complete. Over the rational numbers the problem is NP-hard.

Warning: Essentially this paper has been published in the Journal of Algorithms and is hence subject to copyright restrictions. It is for personal use only.

1. Introduction. One of the most fundamental quantities in linear algebra is the rank of a matrix. This is a well understood, easily computed number. The purpose of this paper is to study a higher-dimensional analogue, namely the rank of a three-dimensional tensor. Let us define this number before we continue. For comparison we first give a slightly unusual definition of matrix rank. A matrix is a two-dimensional array of numbers. It has rank 1 iff it can be written as the outer product of two vectors; by this we mean that there are vectors x and y such that m_ij = x_i y_j. The rank of a general matrix M is then the minimal number of rank-1 matrices M_i such that M = Σ_i M_i. In the same way, a three-dimensional tensor is a three-dimensional array of numbers. It has rank 1 iff it can be written as the outer product of three vectors, and the rank of a general tensor T is the minimal number of rank-1 tensors T_i such that T = Σ_i T_i. Despite the fact that the rank of a tensor is a very natural object, our knowledge of its properties is surprisingly limited. For instance, it does not seem to be known, over any field, what the maximal rank of an n × n × n tensor is. In this paper we prove that over most fields it is NP-hard to compute the rank of a tensor. Thus unless NP = P there will be no easily computable characterization of rank, and furthermore, if NP ≠ coNP there will be no easy-to-verify characterization of the property "having rank at least r". These facts might at least partly explain the lack of progress in the study of tensor rank. One can here draw a parallel with graph theory, where the NP-complete problem of Hamiltonian circuit has been much more elusive than many other properties of graphs. In spite of the interesting and natural questions above, our main motivation to study tensor rank …
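To make the definition above concrete, here is a small illustrative sketch in NumPy (our own example, not part of the paper). It builds a rank-1 3-tensor as the outer product of three vectors and then forms a tensor of rank at most 2 as a sum of two such terms; the paper's result is precisely that determining the minimal number of such terms is NP-hard (NP-complete over finite fields).

```python
import numpy as np

# A 3-tensor has rank 1 iff T[i, j, k] = x[i] * y[j] * z[k] for some vectors x, y, z,
# i.e. it is the outer product of three vectors.
x, y, z = np.array([1.0, 2.0]), np.array([0.0, 1.0]), np.array([3.0, -1.0])
rank1_term = np.einsum('i,j,k->ijk', x, y, z)

# The rank of a general tensor T is the minimal number of rank-1 tensors summing to T.
# Building T from two outer products only certifies rank(T) <= 2; deciding the minimal
# number of terms is the hard problem studied in the paper.
u, v, w = np.array([1.0, -1.0]), np.array([2.0, 0.0]), np.array([1.0, 1.0])
T = rank1_term + np.einsum('i,j,k->ijk', u, v, w)
print(T.shape)  # (2, 2, 2)

# Contrast with matrices: M has rank 1 iff M[i, j] = x[i] * y[j], and matrix rank
# is easy to compute (here via the SVD-based routine in NumPy).
M = np.outer(x, y)
print(np.linalg.matrix_rank(M))  # 1
```

Note that while np.linalg.matrix_rank computes matrix rank efficiently, no analogous efficient routine can exist for the rank of a 3-tensor unless P = NP, which is the point of the paper.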
Similar Resources
Nuclear norm of higher-order tensors
We establish several mathematical and computational properties of the nuclear norm for higher-order tensors. We show that like tensor rank, tensor nuclear norm is dependent on the choice of base field — the value of the nuclear norm of a real 3-tensor depends on whether we regard it as a real 3-tensor or a complex 3-tensor with real entries. We show that every tensor has a nuclear norm attainin...
Efficient Sparse Low-Rank Tensor Completion Using the Frank-Wolfe Algorithm
Most tensor problems are NP-hard, and low-rank tensor completion is much more difficult than low-rank matrix completion. In this paper, we propose a time- and space-efficient low-rank tensor completion algorithm by using the scaled latent nuclear norm for regularization and the Frank-Wolfe (FW) algorithm for optimization. We show that all the steps can be performed efficiently. In particular, FW’s...
Low-Rank Approximation and Completion of Positive Tensors
Unlike the matrix case, computing low-rank approximations of tensors is NP-hard and numerically ill-posed in general. Even the best rank-1 approximation of a tensor is NP-hard. In this paper, we use convex optimization to develop polynomial-time algorithms for low-rank approximation and completion of positive tensors. Our approach is to use algebraic topology to define a new (numerically well-p...
Most Tensor Problems are NP-Hard
We prove that multilinear (tensor) analogues of many efficiently computable problems in numerical linear algebra are NP-hard. Our list here includes: determining the feasibility of a system of bilinear equations, deciding whether a 3-tensor possesses a given eigenvalue, singular value, or spectral norm; approximating an eigenvalue, eigenvector, singular vector, or the spectral norm; and determi...
Cross: Efficient Low-rank Tensor Completion
The completion of tensors, or high-order arrays, has attracted significant attention in recent research. Current literature on tensor completion primarily focuses on recovery from a set of uniformly randomly measured entries, and the required number of measurements to achieve recovery is not guaranteed to be optimal. In addition, the implementations of some previous methods are NP-hard. In this artic...
Journal: J. Algorithms
Volume: 11, Issue: -
Pages: -
Publication date: 1989